An Identity for Kernel Ridge Regression

Authors

  • Fedor Zhdanov
  • Yuri Kalnishkan
Abstract

This paper derives an identity connecting the square loss of ridge regression in on-line mode with the loss of the retrospectively best regressor. Some corollaries about the properties of the cumulative loss of on-line ridge regression are also obtained.
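
In the linear, finite-dimensional case the identity in question can be stated as follows: if gamma_t is the on-line ridge prediction at trial t, a > 0 is the ridge parameter, and A_{t-1} = aI + sum_{s<t} x_s x_s', then sum_t (gamma_t - y_t)^2 / (1 + x_t' A_{t-1}^{-1} x_t) = min over theta of ( a ||theta||^2 + sum_t (theta' x_t - y_t)^2 ). The sketch below checks this form numerically; the statement above is my reading of the linear-case identity, and all names in the code are illustrative rather than the paper's notation.

    import numpy as np

    rng = np.random.default_rng(0)
    T, d, a = 30, 5, 1.0                        # trials, dimension, ridge parameter
    X = rng.standard_normal((T, d))
    y = rng.standard_normal(T)

    # Left-hand side: on-line ridge predictions, each squared error
    # deflated by the factor 1 + x_t' A_{t-1}^{-1} x_t.
    A = a * np.eye(d)                           # A_0 = a I
    b = np.zeros(d)
    lhs = 0.0
    for t in range(T):
        x = X[t]
        gamma = np.linalg.solve(A, b) @ x       # on-line ridge prediction
        lhs += (gamma - y[t]) ** 2 / (1.0 + x @ np.linalg.solve(A, x))
        A += np.outer(x, x)                     # A_t = A_{t-1} + x_t x_t'
        b += y[t] * x

    # Right-hand side: regularised loss of the retrospectively best regressor,
    # min over theta of a ||theta||^2 + sum_t (theta' x_t - y_t)^2.
    theta = np.linalg.solve(a * np.eye(d) + X.T @ X, X.T @ y)
    rhs = a * theta @ theta + np.sum((X @ theta - y) ** 2)

    print(lhs, rhs)                             # the two sides coincide

Up to floating-point error the two printed numbers agree: the cumulative on-line loss, deflated trial by trial, equals the regularised loss of the retrospectively best regressor.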

Similar Articles

Sampling Circulant Matrix Approach: A Comparison of Recent Kernel Matrix Approximation Techniques in Ridge Kernel Regression

As part of a survey of state-of-the-art kernel approximation algorithms, we present a new sampling algorithm for circulant matrix construction to perform fast kernel matrix inversion in kernel ridge regression, comparing theoretical and experimental performance with that of multilevel circulant kernel approximation, incomplete Cholesky decomposition, and random features, all recent advances in th...
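
The abstract above is truncated, but the reason circulant structure is attractive is standard linear algebra: a circulant matrix is diagonalised by the discrete Fourier transform, so linear systems with it can be solved in O(n log n) time via the FFT. A generic sketch of that fact (not the paper's sampling algorithm; all names are illustrative):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 8
    c = rng.standard_normal(n)                  # first column of the circulant C
    b = rng.standard_normal(n)

    # Dense reference: C[i, j] = c[(i - j) mod n], solved directly in O(n^3).
    C = np.array([np.roll(c, k) for k in range(n)]).T
    x_dense = np.linalg.solve(C, b)

    # FFT route: the eigenvalues of C are fft(c), so C x = b becomes a
    # pointwise division in the Fourier domain, O(n log n) overall.
    x_fft = np.real(np.fft.ifft(np.fft.fft(b) / np.fft.fft(c)))

    print(np.allclose(x_dense, x_fft))          # True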

Nonlinear Forecasting with Many Predictors using Kernel Ridge Regression

This paper puts forward kernel ridge regression as an approach for forecasting with many predictors that are related nonlinearly to the target variable. In kernel ridge regression, the observed predictor variables are mapped nonlinearly into a high-dimensional space, where estimation of the predictive regression model is based on a shrinkage estimator to avoid overfitting. We extend the kernel ...
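
A minimal sketch of that mechanism, using a Gaussian (RBF) kernel so the nonlinear mapping stays implicit while the ridge penalty does the shrinking (the kernel choice and all names here are illustrative, not taken from the paper):

    import numpy as np

    def rbf_kernel(A, B, sigma=1.0):
        # Gaussian kernel: K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2)).
        sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-sq / (2 * sigma**2))

    def krr_fit_predict(X, y, X_new, lam=0.1, sigma=1.0):
        # Kernel ridge regression: alpha = (K + lam I)^{-1} y,
        # prediction at new points is K(X_new, X) alpha.
        K = rbf_kernel(X, X, sigma)
        alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
        return rbf_kernel(X_new, X, sigma) @ alpha

    rng = np.random.default_rng(2)
    X = rng.uniform(-3, 3, (100, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)   # nonlinear target
    X_new = np.linspace(-3, 3, 5)[:, None]
    print(krr_fit_predict(X, y, X_new))         # close to sin at the test points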

Kernel Ridge Estimator for the Partially Linear Model under Right-Censored Data

Objective: This paper aims to introduce a modified kernel-type ridge estimator for partially linear models under randomly-right censored data. Such models include two main issues that need to be solved: multi-collinearity and censorship. To address these issues, we improved the kernel estimator based on synthetic data transformation and kNN imputation techniques. The key idea of this paper is t...

Kernel ridge vs. principal component regression: minimax bounds and adaptability of regularization operators

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression...

Kernel ridge vs. principal component regression: Minimax bounds and the qualification of regularization operators

Regularization is an essential element of virtually all kernel methods for nonparametric regression problems. A critical factor in the effectiveness of a given kernel method is the type of regularization that is employed. This article compares and contrasts members from a general class of regularization techniques, which notably includes ridge regression and principal component regression. We d...
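
One common way to make this comparison concrete is to view both methods as spectral filters on the eigenvalues of the kernel matrix: ridge applies the smooth shrinkage lam / (lam + a), while principal component regression keeps the top eigendirections untouched and discards the rest. The sketch below illustrates that standard view on training-set fitted values (my own illustration, not the article's notation):

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.standard_normal((50, 5))
    y = rng.standard_normal(50)
    K = X @ X.T                                 # linear kernel matrix

    lam, V = np.linalg.eigh(K)                  # eigenvalues / eigenvectors of K
    proj = V.T @ y                              # targets in the eigenbasis
    a, k = 1.0, 3                               # ridge parameter, # of components

    # Ridge: smooth shrinkage lam / (lam + a) in every eigendirection;
    # fitted values are K (K + a I)^{-1} y.
    fit_ridge = V @ ((lam / (lam + a)) * proj)

    # PCR: hard threshold, keep the k largest eigenvalues, zero out the rest;
    # fitted values are the projection onto the top-k eigenvectors.
    keep = (lam >= np.sort(lam)[-k]).astype(float)
    fit_pcr = V @ (keep * proj)

    print(np.linalg.norm(fit_ridge - y), np.linalg.norm(fit_pcr - y))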


Journal:
  • Theor. Comput. Sci.

Volume 473, Issue -

Pages -

Publication date: 2010